On Linear Embeddings and Unsupervised Feature Learning
Author
Abstract
The ability to train deep architectures has led to many developments in parametric, non-linear dimensionality reduction, but little attention has been given to algorithms based on convolutional feature extraction without backpropagation training. This paper aims to fill this gap in the context of supervised Mahalanobis metric learning. By modifying two existing approaches to model latent-space similarities with a Student's t-distribution, we obtain classification performance on CIFAR-10 and STL-10 with k-NN in a 50-dimensional space that is competitive with a linear SVM using significantly more features. Using simple modifications to existing feature extraction pipelines, we obtain an error of 0.40% on MNIST, the best reported result without appending distortions for training.
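As a rough illustration of the latent-similarity model described in the abstract, the sketch below (Python/NumPy; function names and the particular sum-based objective are illustrative assumptions, not the paper's exact formulation) computes Student-t similarities over linearly projected points and measures how much similarity mass falls on same-class pairs, the kind of quantity a supervised Mahalanobis metric learner would maximize.

```python
import numpy as np

def t_similarities(Z, dof=1.0):
    """Pairwise Student-t similarities over latent points Z of shape (n, d).

    q_ij is proportional to (1 + ||z_i - z_j||^2 / dof)^(-(dof + 1) / 2),
    normalised over all pairs i != j (the heavy-tailed kernel used in
    t-SNE-style latent models).
    """
    sq_dists = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    kernel = (1.0 + sq_dists / dof) ** (-(dof + 1.0) / 2.0)
    np.fill_diagonal(kernel, 0.0)          # exclude self-pairs
    return kernel / kernel.sum()

def same_class_mass(A, X, y, dof=1.0):
    """Similarity mass assigned to same-class pairs under projection A.

    A : (d_in, d_latent) linear map (a factor of a Mahalanobis metric)
    X : (n, d_in) features, y : (n,) integer class labels.
    A supervised metric learner would choose A to make this large.
    """
    Q = t_similarities(X @ A, dof)
    same = (y[:, None] == y[None, :]) & ~np.eye(len(y), dtype=bool)
    return Q[same].sum()
```

In such a setup, the projection A would be fit with a gradient-based optimiser to maximise the same-class mass, after which test points are classified by k-NN on the projected features X @ A (for example in a 50-dimensional latent space, matching the setup in the abstract).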
Similar resources
Robust Text Classification for Sparsely Labelled Data Using Multi-level Embeddings
The conventional solution for handling sparsely labelled data is extensive feature engineering. This is time-consuming and both task- and domain-specific. We present a novel approach for learning embedded features that aims to alleviate this problem. Our approach jointly learns embeddings at different levels of granularity (word, sentence and document) along with the class labels. The intuition is th...
Enhancing Feature Selection Using Word Embeddings
Health surveillance systems based on online user-generated content often rely on the identification of textual markers that are related to a target disease. Given the high volume of available data, these systems benefit from an automatic feature selection process. This is accomplished either by applying statistical learning techniques, which do not consider the semantic relationship between the...
Non-Linear Smoothed Transductive Network Embedding with Text Information
Network embedding is a classical task which aims to map the nodes of a network to low-dimensional vectors. Most previous network embedding methods are trained in an unsupervised scheme, and the learned node embeddings can then be used as inputs to many machine learning tasks such as node classification and attribute inference. However, the discriminant power of the node embeddings may be improved...
Enhancing Feature Selection Using Word Embeddings: The Case of Flu Surveillance
Health surveillance systems based on online user-generated content often rely on the identification of textual markers that are related to a target disease. Given the high volume of available data, these systems benefit from an automatic feature selection process. This is accomplished either by applying statistical learning techniques, which do not consider the semantic relationship between the...
Unsupervised Locally Linear Embedding for Dimension Reduction
In this paper, Locally Linear Embedding (LLE) has been implemented for unsupervised non-linear dimension reduction; it computes low-dimensional, neighborhood-preserving embeddings of high-dimensional data. Inputs are mapped into a single global coordinate system of lower dimensionality, and its optimizations, though capable of generating highly nonlinear embeddings, do not involve local minima...
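For the LLE entry above, a minimal usage sketch with scikit-learn follows (assuming the standard LocallyLinearEmbedding estimator; the dataset and parameters are illustrative only):

```python
from sklearn.datasets import load_digits
from sklearn.manifold import LocallyLinearEmbedding

# Illustrative data: 64-dimensional digit images.
X, _ = load_digits(return_X_y=True)

# Map the inputs into a single low-dimensional coordinate system while
# preserving each point's local linear neighbourhood structure.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
X_low = lle.fit_transform(X)   # shape: (n_samples, 2)
```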